
172 Snowflake Jobs - Page 6

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

0.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose - the relentless pursuit of a world that works better for people - we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

Inviting applications for the role of Lead Consultant - PySpark/Python Data Engineer!

We are looking for a passionate Python developer to join our team at Genpact. You will be responsible for developing and implementing high-quality software solutions for data transformation and analytics using cutting-edge programming features and frameworks, and for collaborating with other teams in the firm to define, design, and ship new features. As an active part of our company, you will brainstorm solutions that suit our requirements and meet our business goals. You will also work on data engineering problems and build data pipelines, with ample opportunities to take on challenging and innovative projects using the latest technologies and tools. If you enjoy working in a fast-paced and collaborative environment, we encourage you to apply for this exciting role. We offer industry-standard compensation packages, relocation assistance, and professional growth and development opportunities.

Responsibilities:
- Develop, test, and maintain high-quality solutions using PySpark/Python (a pipeline sketch follows this listing).
- Participate in the entire software development lifecycle, building, testing, and delivering high-quality data pipelines.
- Collaborate with cross-functional teams to identify and solve complex problems.
- Write clean and reusable code that can be easily maintained and scaled.
- Keep up to date with emerging trends and technologies in Python development.

Qualifications we seek in you!
Minimum qualifications:
- Experience as a Python Developer with a strong portfolio of projects.
- Bachelor's degree in Computer Science, Software Engineering, or a related field.
- Experience developing pipelines on cloud platforms such as AWS or Azure using AWS Glue or ADF.
- In-depth understanding of the Python software development stacks, ecosystems, frameworks, and tools such as NumPy, SciPy, Pandas, Dask, spaCy, NLTK, Great Expectations, Splink, and PyTorch.
- Experience with data platforms such as Databricks/Snowflake.
- Experience with front-end development using HTML or Python.
- Familiarity with database technologies such as SQL and NoSQL.
- Excellent problem-solving ability with solid communication and collaboration skills.

Preferred skills and qualifications:
- Experience with popular Python frameworks such as Django, Flask, FastAPI, or Pyramid.
- Knowledge of GenAI concepts and LLMs.
- Contributions to open-source Python projects or active involvement in the Python community.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws.
Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook. Please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
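To make the pipeline work this posting describes concrete, here is a minimal PySpark sketch of the read-clean-aggregate-write pattern such a role involves. It is only an illustration: the S3 paths, table, and column names are hypothetical placeholders, not details from the posting.

```python
# Minimal PySpark pipeline sketch: read raw CSVs, deduplicate and clean,
# aggregate to a daily grain, write partitioned Parquet.
# All paths and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_pipeline").getOrCreate()

raw = (
    spark.read.option("header", True)
    .option("inferSchema", True)
    .csv("s3://example-bucket/raw/orders/")  # hypothetical source path
)

cleaned = (
    raw.dropDuplicates(["order_id"])              # remove duplicate records
    .filter(F.col("amount") > 0)                  # drop invalid amounts
    .withColumn("order_date", F.to_date("order_ts"))
)

daily = cleaned.groupBy("order_date").agg(
    F.count("order_id").alias("orders"),
    F.sum("amount").alias("revenue"),
)

daily.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-bucket/curated/daily_orders/"  # hypothetical target path
)
```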

Posted 2 weeks ago

Apply

0.0 years

0 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site


Ready to shape the future of work? At Genpact, we don't just adapt to change - we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Our industry-first accelerator is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. Our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment.

Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.

Inviting applications for the role of Lead Consultant - Snowflake Data Engineer (Snowflake + Python + Cloud)!

In this role, the Snowflake Data Engineer is responsible for providing technical direction and leading a group of one or more developers toward a goal.

Job description:
- Experience in the IT industry.
- Working experience building productionized data ingestion and processing pipelines in Snowflake.
- Strong understanding of Snowflake architecture; fully well-versed in data warehousing concepts.
- Expertise in Snowflake features and the integration of Snowflake with other data processing systems.
- Able to create data pipelines for ETL/ELT.
- Excellent presentation and communication skills, both written and verbal.
- Ability to problem-solve and architect in an environment with unclear requirements.
- Able to create high-level and low-level design documents based on requirements.
- Hands-on experience in configuring, troubleshooting, testing, and managing data platforms, on premises or in the cloud.
- Awareness of data visualisation tools and methodologies.
- Work independently on business problems and generate meaningful insights.
- Experience/knowledge of Snowpark, Streamlit, or GenAI is good to have but not mandatory.
- Experience implementing Snowflake best practices.
- Snowflake SnowPro Core Certification will be an added advantage.

Roles and responsibilities:
- Requirement gathering, creating design documents, providing solutions to customers, working with offshore teams, etc.
- Writing SQL queries against Snowflake and developing scripts to extract, load, and transform data.
- Hands-on experience with Snowflake utilities such as SnowSQL, bulk copy, Snowpipe, Tasks, Streams, Time Travel, Cloning, Optimizer, Metadata Manager, data sharing, stored procedures and UDFs, Snowsight, and Streamlit.
- Experience with the Snowflake cloud data warehouse and AWS S3 buckets or Azure Blob Storage containers for integrating data from multiple source systems.
- Some experience with AWS services (S3, Glue, Lambda) or Azure services (Blob Storage, ADLS Gen2, ADF).
- Good experience in Python/PySpark integration with Snowflake and cloud (AWS/Azure), with the ability to leverage cloud services for data processing and storage.
- Proficiency in the Python programming language, including knowledge of data types, variables, functions, loops, conditionals, and other Python-specific concepts.
- Knowledge of ETL (Extract, Transform, Load) processes and tools, and the ability to design and develop efficient ETL jobs using Python or PySpark.
- Some experience with Snowflake RBAC and data security.
- Good experience implementing CDC or SCD type-2 (a sketch follows below).
- In-depth understanding of data warehouse and ETL concepts and data modelling.
- Experience in requirement gathering, analysis, design, development, and deployment.
- Experience building data ingestion pipelines; optimize and tune data pipelines for performance and scalability.
- Able to communicate with clients and lead a team.
- Proficiency in working with Airflow or other workflow management tools for scheduling and managing ETL jobs.
- Good to have: experience in deployment using CI/CD tools and with repositories like Azure Repos, GitHub, etc.

Qualifications we seek in you!
Minimum qualifications: B.E./Master's in Computer Science, Information Technology, or Computer Engineering, or any equivalent degree, with good IT experience and relevant experience as a Snowflake Data Engineer.

Skill matrix: Snowflake, Python/PySpark, AWS/Azure, ETL concepts, and data warehousing concepts.

Why join Genpact?
- Be a transformation leader - work at the cutting edge of AI, automation, and digital innovation.
- Make an impact - drive change for global enterprises and solve business challenges that matter.
- Accelerate your career - get hands-on experience, mentorship, and continuous learning opportunities.
- Work with the best - join 140,000+ bold thinkers and problem-solvers who push boundaries every day.
- Thrive in a values-driven culture - our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress.

Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: Up. Let's build tomorrow together.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
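Since the posting calls out SCD type-2 experience, here is a hedged sketch of the standard close-and-insert pattern against Snowflake using the official Python connector. All credentials, tables, and columns (dim_customer, stg_customer, attr_hash, etc.) are hypothetical placeholders.

```python
# Sketch of an SCD type-2 load: close out changed current rows, then insert
# new versions for changed and brand-new keys. Names are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",  # placeholders
    warehouse="ETL_WH", database="ANALYTICS", schema="DW",
)

CLOSE_CHANGED = """
UPDATE dim_customer d
SET d.valid_to = CURRENT_TIMESTAMP(), d.is_current = FALSE
FROM stg_customer s
WHERE d.customer_id = s.customer_id
  AND d.is_current
  AND d.attr_hash <> s.attr_hash
"""

INSERT_NEW_VERSIONS = """
INSERT INTO dim_customer (customer_id, name, city, attr_hash,
                          valid_from, valid_to, is_current)
SELECT s.customer_id, s.name, s.city, s.attr_hash,
       CURRENT_TIMESTAMP(), NULL, TRUE
FROM stg_customer s
LEFT JOIN dim_customer d
  ON d.customer_id = s.customer_id AND d.is_current
WHERE d.customer_id IS NULL  -- no current row: changed or brand-new key
"""

cur = conn.cursor()
cur.execute(CLOSE_CHANGED)        # step 1: expire changed history rows
cur.execute(INSERT_NEW_VERSIONS)  # step 2: insert fresh current versions
conn.close()
```

Running the close step first means any changed key has no current row left, so the anti-join in step 2 picks up both changed and genuinely new keys.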

Posted 2 weeks ago

Apply

3.0 - 5.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Introduction
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology. A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including Software and Red Hat. Curiosity and a constant quest for knowledge serve as the foundation to success in IBM Consulting. In your role, you'll be encouraged to challenge the norm, investigate ideas outside of your role, and come up with creative solutions resulting in groundbreaking impact for a wide network of clients. Our culture of evolution and empathy centers on long-term career growth and development opportunities in an environment that embraces your unique skills and experience.

Your role and responsibilities
Minimum 3 years of experience in developing application programs to implement ETL workflows by creating ETL jobs and data models in datamarts using Snowflake, DBT, Unix, and SQL technologies (a dbt-invocation sketch follows this listing). Redesign Control-M batch processing for the ETL job build to run efficiently in production. Study existing systems to evaluate effectiveness and develop new systems to improve efficiency and workflow.

Responsibilities: Perform requirements identification; conduct business program analysis, testing, and system enhancements while providing production support. The developer should have a good understanding of working in an Agile environment and of JIRA and SharePoint tools. Good written and verbal communication skills are a must, as the candidate is expected to work directly with client counterparts.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise
- Intuitive individual with an ability to manage change and proven time management
- Proven interpersonal skills while contributing to team effort by accomplishing related results as needed
- Up-to-date technical knowledge from attending educational workshops and reviewing publications

Preferred technical and professional experience
- Develop triggers, functions, and stored procedures to support this effort
- Assist with impact analysis of changing upstream processes to data warehouse and reporting systems
- Assist with design, testing, support, and debugging of new and existing ETL and reporting processes
- Perform data profiling and analysis using a variety of tools
- Troubleshoot and support production processes
- Create and maintain documentation
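The role pairs dbt and Snowflake under a batch scheduler (Control-M). As a hedged illustration of how a scheduled job step might drive dbt programmatically, here is a minimal sketch assuming dbt-core 1.5+ (which exposes dbtRunner) and a hypothetical "datamart" model selector; the project layout is not from the posting.

```python
# Sketch: run dbt models for the datamart, then their tests, and exit
# non-zero on failure so the scheduler (e.g. a Control-M step) can react.
from dbt.cli.main import dbtRunner, dbtRunnerResult

dbt = dbtRunner()

for args in (["run", "--select", "datamart"],
             ["test", "--select", "datamart"]):
    result: dbtRunnerResult = dbt.invoke(args)
    if not result.success:
        raise SystemExit(f"dbt {args[0]} failed")  # fail the batch job
```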

Posted 2 weeks ago

Apply

5.0 - 7.0 years

0 Lacs

India

On-site


Introduction
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your role and responsibilities
- Develop triggers, functions, and stored procedures to support this effort
- Assist with impact analysis of changing upstream processes to data warehouse and reporting systems
- Assist with design, testing, support, and debugging of new and existing ETL and reporting processes
- Perform data profiling and analysis using a variety of tools
- Troubleshoot and support production processes
- Create and maintain documentation

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise
- BE/B.Tech in any stream, M.Sc. (Computer Science/IT), or M.C.A., with a minimum of 5+ years of experience
- Must have: Snowflake, AWS, complex SQL
- Experience with software architecture in cloud-based infrastructures
- Experience in ETL processes and data modelling techniques
- Experience in designing and managing large-scale data warehouses

Preferred technical and professional experience
- Good to have in addition to the must-haves: DBT, Tableau, Python, JavaScript
- Develop complex SQL queries for data analytics and business intelligence (see the query sketch below)
- Background working with data analytics, business intelligence, or related fields
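To ground the "complex SQL" requirement, here is an illustrative analytics query run against Snowflake from Python, using window functions and Snowflake's QUALIFY clause. The table, columns, and credentials are hypothetical; fetch_pandas_all requires the connector's pandas extra.

```python
# Sketch: running-total and month-over-month growth per region,
# limited to each region's latest 12 months. Names are placeholders.
import snowflake.connector

QUERY = """
SELECT region,
       order_month,
       revenue,
       SUM(revenue) OVER (PARTITION BY region ORDER BY order_month)
           AS running_revenue,
       revenue / NULLIF(LAG(revenue) OVER (
           PARTITION BY region ORDER BY order_month), 0) - 1
           AS mom_growth
FROM monthly_sales
QUALIFY ROW_NUMBER() OVER (
    PARTITION BY region ORDER BY order_month DESC) <= 12
ORDER BY region, order_month
"""

conn = snowflake.connector.connect(
    account="my_account", user="analyst", password="***",  # placeholders
    warehouse="BI_WH", database="ANALYTICS", schema="MART",
)
df = conn.cursor().execute(QUERY).fetch_pandas_all()
print(df.head())
conn.close()
```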

Posted 2 weeks ago

Apply

0.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site


Introduction
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your role and responsibilities
Data strategy and planning: Develop and implement data architecture strategies that align with organizational goals and objectives. Collaborate with business stakeholders to understand data requirements and translate them into actionable plans.
Data modeling: Design and implement logical and physical data models to support business needs. Ensure data models are scalable, efficient, and compliant with industry best practices.
Database design and management: Oversee the design and management of databases, selecting appropriate database technologies based on requirements. Optimize database performance and ensure data integrity and security.
Data integration: Define and implement data integration strategies to facilitate the seamless flow of information across systems.

Responsibilities:
- Experience in data architecture and engineering
- Proven expertise with the Snowflake data platform
- Strong understanding of ETL/ELT processes and data integration
- Experience with data modeling and data warehousing concepts
- Familiarity with performance tuning and optimization techniques
- Excellent problem-solving skills and attention to detail
- Strong communication and collaboration skills

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise
- Cloud & data architecture: AWS, Snowflake
- ETL & data engineering: AWS Glue, Apache Spark, Step Functions
- Big data & analytics: Athena, Presto, Hadoop
- Database & storage: SQL, SnowSQL
- Security & compliance: IAM, KMS, data masking (a masking-policy sketch follows this listing)

Preferred technical and professional experience
- Cloud data warehousing: Snowflake (data modeling, query optimization)
- Data transformation: DBT (Data Build Tool) for ELT pipeline management
- Metadata & data governance: Alation (data catalog, lineage, governance)
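For the data-masking item above, a hedged sketch of Snowflake dynamic data masking issued through the Python connector: create a masking policy, then attach it to a PII column. The role, table, column, and policy names are hypothetical, and applying policies presumes the session role has the necessary privileges.

```python
# Sketch: mask an email column for everyone except a privileged role.
# All identifiers and credentials are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="admin", password="***",  # placeholders
    role="SECURITYADMIN", warehouse="ADMIN_WH",
    database="ANALYTICS", schema="DW",
)

statements = [
    """
    CREATE MASKING POLICY email_mask AS (val STRING)
    RETURNS STRING ->
      CASE WHEN CURRENT_ROLE() IN ('PII_READER') THEN val
           ELSE '*** MASKED ***' END
    """,
    "ALTER TABLE dim_customer MODIFY COLUMN email "
    "SET MASKING POLICY email_mask",
]

cur = conn.cursor()
for stmt in statements:
    cur.execute(stmt)
conn.close()
```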

Posted 2 weeks ago

Apply

0.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Introduction
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology. The IBM Client Innovation Center (CIC) is an innovative and exciting part of IBM, working as IBM Global Business Services' technical delivery partner. We continually innovate for our customers, employees, and shareholders. Our technical skills range from application development and testing services through technical support to cloud and artificial intelligence. Your work will power IBM and its clients globally, collaborating and integrating code and processes into enterprise systems. You will have access to the latest education, tools, and technology, and a limitless career path with the world's technology leader. Come to IBM and make a global impact!

Your role and responsibilities
- Provide expertise in analysis, requirements gathering, design, coordination, customization, testing, and support of reports in the client's environment
- Develop and maintain a strong working relationship with business and technical members of the team
- Relentless focus on quality and continuous improvement
- Perform root cause analysis of report issues
- Development/evolutionary maintenance of the environment, performance, capability, and availability
- Assist in defining technical requirements and developing solutions
- Effective content and source-code management, troubleshooting, and debugging

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise
- Tableau Desktop Specialist; strong understanding of SQL for querying databases
- Good to have: Python, Snowflake, statistics, ETL experience
- Extensive knowledge of creating impactful visualizations using Tableau
- Thorough understanding of SQL and advanced SQL (joins and relationships)
- Experience working with different databases and with blending and creating relationships in Tableau
- Extensive knowledge of creating custom SQL to pull desired data from databases
- Troubleshooting capabilities to debug data controls

Preferred technical and professional experience
- Capable of converting business requirements into a workable model
- Good communication skills, willingness to learn new technologies, team player, self-motivated, positive attitude

Posted 2 weeks ago

Apply

3.0 - 5.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Introduction
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology. A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including Software and Red Hat. Curiosity and a constant quest for knowledge serve as the foundation to success in IBM Consulting. In your role, you'll be encouraged to challenge the norm, investigate ideas outside of your role, and come up with creative solutions resulting in groundbreaking impact for a wide network of clients. Our culture of evolution and empathy centers on long-term career growth and development opportunities in an environment that embraces your unique skills and experience.

Your role and responsibilities
Minimum 3 years of experience in developing application programs to implement ETL workflows by creating ETL jobs and data models in datamarts using Snowflake, DBT, Unix, and SQL technologies. Redesign Control-M batch processing for the ETL job build to run efficiently in production. Study existing systems to evaluate effectiveness and develop new systems to improve efficiency and workflow.

Responsibilities: Perform requirements identification; conduct business program analysis, testing, and system enhancements while providing production support. The developer should have a good understanding of working in an Agile environment and of JIRA and SharePoint tools. Good written and verbal communication skills are a must, as the candidate is expected to work directly with client counterparts.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise
- Intuitive individual with an ability to manage change and proven time management
- Proven interpersonal skills while contributing to team effort by accomplishing related results as needed
- Up-to-date technical knowledge from attending educational workshops and reviewing publications

Preferred technical and professional experience
- Develop triggers, functions, and stored procedures to support this effort
- Assist with impact analysis of changing upstream processes to data warehouse and reporting systems
- Assist with design, testing, support, and debugging of new and existing ETL and reporting processes
- Perform data profiling and analysis using a variety of tools
- Troubleshoot and support production processes
- Create and maintain documentation

Posted 2 weeks ago

Apply

5.0 - 7.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Introduction
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your role and responsibilities
- Develop triggers, functions, and stored procedures to support this effort
- Assist with impact analysis of changing upstream processes to data warehouse and reporting systems
- Assist with design, testing, support, and debugging of new and existing ETL and reporting processes
- Perform data profiling and analysis using a variety of tools
- Troubleshoot and support production processes
- Create and maintain documentation

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise
- BE/B.Tech in any stream, M.Sc. (Computer Science/IT), or M.C.A., with a minimum of 5+ years of experience
- Must have: Snowflake, AWS, complex SQL
- Experience with software architecture in cloud-based infrastructures
- Experience in ETL processes and data modelling techniques
- Experience in designing and managing large-scale data warehouses

Preferred technical and professional experience
- Good to have in addition to the must-haves: DBT, Tableau, Python, JavaScript
- Develop complex SQL queries for data analytics and business intelligence
- Background working with data analytics, business intelligence, or related fields

Posted 2 weeks ago

Apply

5.0 - 7.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Introduction
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your role and responsibilities
- Create solution outlines and macro designs describing end-to-end product implementation in data platforms, including system integration, data ingestion, data processing, the serving layer, design patterns, and platform architecture principles
- Contribute to pre-sales and sales support through RFP responses, solution architecture, planning, and estimation
- Contribute to reusable component/asset/accelerator development to support capability development
- Participate in customer presentations as a platform architect / subject matter expert on big data, Azure Cloud, and related technologies
- Participate in customer PoCs to deliver the outcomes
- Participate in delivery reviews/product reviews and quality assurance, and act as design authority

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise
- Experience in designing data products providing descriptive, prescriptive, and predictive analytics to end users or other systems
- Experience in data engineering and in architecting and implementing data platforms on the Azure Cloud Platform
- Experience on Azure Cloud is mandatory (ADLS Gen1/Gen2, Data Factory, Databricks, Synapse Analytics, Azure SQL, Cosmos DB, Event Hub, Snowflake), plus Azure Purview, Microsoft Fabric, Kubernetes, Terraform, and Airflow
- Experience in the big data stack (Hadoop ecosystem: Hive, HBase, Kafka, Spark, Scala, PySpark, Python, etc.) with Cloudera or Hortonworks

Preferred technical and professional experience
- Experience in architecting complex data platforms on the Azure Cloud Platform and on-prem
- Experience and exposure to implementation of Data Fabric and Data Mesh concepts and solutions such as Microsoft Fabric, Starburst, Denodo, IBM Data Virtualisation, Talend, or Tibco Data Fabric
- Exposure to data cataloging and governance solutions such as Collibra, Alation, Watson Knowledge Catalog, Databricks Unity Catalog, Apache Atlas, Snowflake data glossary, etc.

Posted 2 weeks ago

Apply

5.0 - 9.0 years

5 - 9 Lacs

Hyderabad / Secunderabad, Telangana, India

On-site


Job description
Professional & technical skills:
- Must-have skills: strong experience in Snowflake Data Warehouse
- Good-to-have skills: experience in other data warehousing technologies such as Redshift, BigQuery, or Azure Synapse Analytics
- Experience in designing and developing applications using Snowflake Data Warehouse
- Strong understanding of data warehousing concepts and best practices
- Experience in working with cross-functional teams to deliver high-quality solutions
- Excellent communication and interpersonal skills

Posted 2 weeks ago

Apply

4.0 - 9.0 years

4 - 9 Lacs

Bhubaneswar, Odisha, India

On-site


Project role: Application Developer
Project role description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Snowflake Data Warehouse

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements using Snowflake Data Warehouse. Your typical day will involve collaborating with cross-functional teams, analyzing business requirements, and developing scalable and efficient solutions.

Roles & responsibilities:
- Design, build, and configure applications to meet business process and application requirements using Snowflake Data Warehouse
- Collaborate with cross-functional teams to analyze business requirements and develop scalable and efficient solutions
- Develop and maintain technical documentation, including design documents, test plans, and user manuals
- Ensure the quality of the application by conducting unit testing, integration testing, and performance testing (see the test sketch after this listing)

Professional & technical skills:
- Must-have skills: experience in Snowflake Data Warehouse
- Good-to-have skills: experience in other data warehousing technologies like Redshift, BigQuery, or Azure Synapse Analytics
- Strong understanding of database concepts and SQL
- Experience in ETL tools like Talend, Informatica, or DataStage
- Experience in developing and maintaining technical documentation
- Experience in conducting unit testing, integration testing, and performance testing

Additional information: The candidate should have experience in Snowflake Data Warehouse. The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering impactful data-driven solutions.
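As a hedged example of the testing duties above, here is a small pytest sketch of data-quality checks against a Snowflake table. The table name, credentials, and invariants are hypothetical; real suites would cover the posting's integration and performance cases too.

```python
# Sketch: pytest data-quality checks for a warehouse table (names are
# placeholders). Run with: pytest test_fact_orders.py
import pytest
import snowflake.connector

@pytest.fixture(scope="module")
def cursor():
    conn = snowflake.connector.connect(
        account="my_account", user="ci_user", password="***",  # placeholders
        warehouse="CI_WH", database="ANALYTICS", schema="DW",
    )
    yield conn.cursor()
    conn.close()

def test_no_duplicate_keys(cursor):
    cursor.execute("""
        SELECT COUNT(*) FROM (
          SELECT order_id FROM fact_orders
          GROUP BY order_id HAVING COUNT(*) > 1
        )
    """)
    assert cursor.fetchone()[0] == 0  # business key must be unique

def test_table_not_empty(cursor):
    cursor.execute("SELECT COUNT(*) FROM fact_orders")
    assert cursor.fetchone()[0] > 0   # load produced rows
```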

Posted 2 weeks ago

Apply

5.0 - 10.0 years

5 - 10 Lacs

Thiruvananthapuram / Trivandrum, Kerala, India

On-site


Snowflake data warehouse development: Design, implement, and optimize data warehouses on the Snowflake cloud platform. Ensure effective utilization of Snowflake's features for scalable, efficient, and high-performance data storage and processing.

Data pipeline development: Develop, implement, and optimize end-to-end data pipelines on the Snowflake platform. Design and maintain ETL workflows to enable seamless data processing across systems.

Data transformation with PySpark: Leverage PySpark for data transformations within the Snowflake environment. Implement complex data cleansing, enrichment, and validation processes using PySpark to ensure the highest data quality (a sketch follows this listing).

Collaboration: Work closely with cross-functional teams to design data solutions aligned with business requirements. Engage with stakeholders to understand business needs and translate them into technical solutions.
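For the PySpark-in-Snowflake transformation work described above, a hedged sketch using the Snowflake Spark connector: read a staging table, cleanse and validate, and write the result back. It assumes the connector and JDBC driver JARs are on the Spark classpath; connection options, table, and column names are placeholders.

```python
# Sketch: cleanse customer records between Snowflake tables via the
# Snowflake Spark connector. All identifiers are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("sf_cleanse").getOrCreate()

sf_options = {
    "sfURL": "my_account.snowflakecomputing.com",  # placeholder account
    "sfUser": "etl_user",
    "sfPassword": "***",
    "sfDatabase": "ANALYTICS",
    "sfSchema": "STAGING",
    "sfWarehouse": "ETL_WH",
}

raw = (spark.read.format("net.snowflake.spark.snowflake")
       .options(**sf_options)
       .option("dbtable", "stg_customers")
       .load())

cleansed = (
    raw.dropDuplicates(["CUSTOMER_ID"])
    .withColumn("EMAIL", F.lower(F.trim("EMAIL")))           # enrichment
    .filter(F.col("EMAIL").rlike(r"^[^@]+@[^@]+\.[^@]+$"))   # crude validation
)

(cleansed.write.format("net.snowflake.spark.snowflake")
 .options(**sf_options)
 .option("dbtable", "customers_clean")
 .mode("overwrite")
 .save())
```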

Posted 2 weeks ago

Apply

5.0 - 10.0 years

5 - 10 Lacs

Cochin / Kochi / Ernakulam, Kerala, India

On-site


Snowflake data warehouse development: Design, implement, and optimize data warehouses on the Snowflake cloud platform. Ensure effective utilization of Snowflake's features for scalable, efficient, and high-performance data storage and processing.

Data pipeline development: Develop, implement, and optimize end-to-end data pipelines on the Snowflake platform. Design and maintain ETL workflows to enable seamless data processing across systems.

Data transformation with PySpark: Leverage PySpark for data transformations within the Snowflake environment. Implement complex data cleansing, enrichment, and validation processes using PySpark to ensure the highest data quality.

Collaboration: Work closely with cross-functional teams to design data solutions aligned with business requirements. Engage with stakeholders to understand business needs and translate them into technical solutions.

Posted 2 weeks ago

Apply

6.0 - 11.0 years

6 - 11 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site


- Experience in end-to-end data pipeline development and troubleshooting using Snowflake and Matillion Cloud: 5+ years of experience in DWH, with 2-4 years implementing DWH on Snowflake using Matillion.
- Design, develop, and maintain ETL processes using Matillion to extract, transform, and load data into Snowflake; develop and debug ETL programs primarily using Matillion Cloud.
- Collaborate with data architects and business stakeholders to understand data requirements and translate them into technical solutions.
- We seek a skilled technical professional to lead end-to-end system and architecture design for our application and infrastructure.
- Data validation and end-to-end testing of ETL objects, source data analysis, and data profiling.
- Troubleshoot and resolve issues related to Matillion development and data integration.
- Collaborate with business users to create architecture in alignment with business needs.
- Collaborate in developing project requirements for end-to-end data integration processes using ETL for structured, semi-structured, and unstructured data.
- Strong understanding of ELT/ETL and integration concepts and design best practices.
- Experience in performance tuning of Matillion Cloud data pipelines, with the ability to troubleshoot issues quickly.
- Experience in SnowSQL and Snowpipe will be an added advantage (a minimal Snowpipe sketch follows this listing).
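Since Snowpipe is called out as an added advantage, here is a hedged sketch of defining an auto-ingest pipe through the Python connector. It assumes an external stage (@orders_stage) and target table (raw_orders) already exist; credentials and names are placeholders.

```python
# Sketch: create an auto-ingest Snowpipe that copies staged CSV files
# into a raw table as they arrive. All identifiers are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",  # placeholders
    warehouse="ETL_WH", database="ANALYTICS", schema="RAW",
)

conn.cursor().execute("""
    CREATE PIPE IF NOT EXISTS orders_pipe AUTO_INGEST = TRUE AS
      COPY INTO raw_orders
      FROM @orders_stage
      FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
""")
conn.close()
```

With AUTO_INGEST, the cloud storage event notifications (e.g. S3 event queues) trigger the loads; without it, loads are driven through the Snowpipe REST endpoint.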

Posted 2 weeks ago

Apply

6.0 - 11.0 years

6 - 11 Lacs

Pune, Maharashtra, India

On-site


- Experience in end-to-end data pipeline development and troubleshooting using Snowflake and Matillion Cloud: 5+ years of experience in DWH, with 2-4 years implementing DWH on Snowflake using Matillion.
- Design, develop, and maintain ETL processes using Matillion to extract, transform, and load data into Snowflake; develop and debug ETL programs primarily using Matillion Cloud.
- Collaborate with data architects and business stakeholders to understand data requirements and translate them into technical solutions.
- We seek a skilled technical professional to lead end-to-end system and architecture design for our application and infrastructure.
- Data validation and end-to-end testing of ETL objects, source data analysis, and data profiling.
- Troubleshoot and resolve issues related to Matillion development and data integration.
- Collaborate with business users to create architecture in alignment with business needs.
- Collaborate in developing project requirements for end-to-end data integration processes using ETL for structured, semi-structured, and unstructured data.
- Strong understanding of ELT/ETL and integration concepts and design best practices.
- Experience in performance tuning of Matillion Cloud data pipelines, with the ability to troubleshoot issues quickly.
- Experience in SnowSQL and Snowpipe will be an added advantage.

Posted 2 weeks ago

Apply

6.0 - 11.0 years

6 - 11 Lacs

Hyderabad / Secunderabad, Telangana, Telangana, India

On-site


- Experience in end-to-end data pipeline development and troubleshooting using Snowflake and Matillion Cloud: 5+ years of experience in DWH, with 2-4 years implementing DWH on Snowflake using Matillion.
- Design, develop, and maintain ETL processes using Matillion to extract, transform, and load data into Snowflake; develop and debug ETL programs primarily using Matillion Cloud.
- Collaborate with data architects and business stakeholders to understand data requirements and translate them into technical solutions.
- We seek a skilled technical professional to lead end-to-end system and architecture design for our application and infrastructure.
- Data validation and end-to-end testing of ETL objects, source data analysis, and data profiling.
- Troubleshoot and resolve issues related to Matillion development and data integration.
- Collaborate with business users to create architecture in alignment with business needs.
- Collaborate in developing project requirements for end-to-end data integration processes using ETL for structured, semi-structured, and unstructured data.
- Strong understanding of ELT/ETL and integration concepts and design best practices.
- Experience in performance tuning of Matillion Cloud data pipelines, with the ability to troubleshoot issues quickly.
- Experience in SnowSQL and Snowpipe will be an added advantage.

Posted 2 weeks ago

Apply

11.0 - 21.0 years

11 - 21 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site


Oversees and designs the information architecture for the data warehouse, including all information structures (staging area, data warehouse, data marts, operational data stores); oversees standardization of data definitions and development of physical and logical modelling.

Deep understanding of data warehousing, enterprise architectures, dimensional modelling, star and snowflake schema design (see the DDL sketch after this listing), reference DW architectures, ETL architecture, ETL (Extract, Transform, Load), data analysis, data conversion and transformation, database design, data warehouse optimization, data mart development, and enterprise data warehouse maintenance and support.

Significant experience working as a Data Architect, with depth in data integration and data architecture for enterprise data warehouse implementations (conceptual, logical, physical, and dimensional models).

Maintain in-depth and current knowledge of cloud architecture, data lakes, data warehouses, BI platforms, analytics model platforms, and ETL tools. Cloud knowledge, especially AWS knowledge, is necessary. Well-versed in best practices around data governance, data stewardship, and overall data quality initiatives.

Inventory existing data designs, including data flows and systems; design data models that integrate new and existing environments; conduct architecture meetings with team leads and solution architects to communicate complex ideas, issues, and system processes, along with architecture discussions in their current projects.

Design the data warehouse and provide guidance to the team in implementation using Snowflake SnowSQL. Good hands-on experience converting Source Independent Load, Post Load Process, stored procedures, and SQL to Snowflake. Strong understanding of ELT/ETL and integration concepts and design best practices. Experience in performance tuning of Snowflake pipelines, with the ability to troubleshoot issues quickly. 1+ year of work experience in DBT, data modeling, and data integration. Advanced SQL skills for analysis and standardizing queries.

Mandatory skillset: Snowflake, DBT, and data architecture design experience in data warehousing.
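To illustrate the star-schema design work this role centers on, here is a hedged DDL sketch (a date dimension plus a sales fact keyed to it) issued through the Python connector. All table, column, and credential values are hypothetical; Snowflake records but does not enforce the key constraints.

```python
# Sketch: minimal star schema in Snowflake (one dimension, one fact).
# Identifiers and credentials are placeholders.
import snowflake.connector

DDL = [
    """
    CREATE TABLE IF NOT EXISTS dim_date (
        date_key      NUMBER PRIMARY KEY,   -- surrogate key, e.g. 20240131
        calendar_date DATE,
        year          NUMBER,
        month         NUMBER,
        day_of_week   VARCHAR
    )
    """,
    """
    CREATE TABLE IF NOT EXISTS fact_sales (
        sales_key   NUMBER AUTOINCREMENT,
        date_key    NUMBER REFERENCES dim_date (date_key),
        product_key NUMBER,
        store_key   NUMBER,
        quantity    NUMBER,
        amount      NUMBER(18, 2)
    )
    """,
]

conn = snowflake.connector.connect(
    account="my_account", user="architect", password="***",  # placeholders
    warehouse="DEV_WH", database="ANALYTICS", schema="DW",
)
cur = conn.cursor()
for stmt in DDL:
    cur.execute(stmt)
conn.close()
```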

Posted 2 weeks ago

Apply

11.0 - 21.0 years

11 - 21 Lacs

Pune, Maharashtra, India

On-site


Oversees and designs the information architecture for the data warehouse, including all information structures (staging area, data warehouse, data marts, operational data stores); oversees standardization of data definitions and development of physical and logical modelling.

Deep understanding of data warehousing, enterprise architectures, dimensional modelling, star and snowflake schema design, reference DW architectures, ETL architecture, ETL (Extract, Transform, Load), data analysis, data conversion and transformation, database design, data warehouse optimization, data mart development, and enterprise data warehouse maintenance and support.

Significant experience working as a Data Architect, with depth in data integration and data architecture for enterprise data warehouse implementations (conceptual, logical, physical, and dimensional models).

Maintain in-depth and current knowledge of cloud architecture, data lakes, data warehouses, BI platforms, analytics model platforms, and ETL tools. Cloud knowledge, especially AWS knowledge, is necessary. Well-versed in best practices around data governance, data stewardship, and overall data quality initiatives.

Inventory existing data designs, including data flows and systems; design data models that integrate new and existing environments; conduct architecture meetings with team leads and solution architects to communicate complex ideas, issues, and system processes, along with architecture discussions in their current projects.

Design the data warehouse and provide guidance to the team in implementation using Snowflake SnowSQL. Good hands-on experience converting Source Independent Load, Post Load Process, stored procedures, and SQL to Snowflake. Strong understanding of ELT/ETL and integration concepts and design best practices. Experience in performance tuning of Snowflake pipelines, with the ability to troubleshoot issues quickly. 1+ year of work experience in DBT, data modeling, and data integration. Advanced SQL skills for analysis and standardizing queries.

Mandatory skillset: Snowflake, DBT, and data architecture design experience in data warehousing.

Posted 2 weeks ago

Apply

14.0 - 24.0 years

14 - 24 Lacs

Mumbai, Maharashtra, India

On-site


Cloud Data Architect: a cloud architect with experience in Azure and Snowflake, along with experience in RFP and proposal writing.

- Responsible for designing and implementing secure, scalable, and highly available cloud-based solutions, and for estimation, on AWS and Azure Cloud
- Experience in Azure Databricks, ADF, Azure Synapse, PySpark, and Snowflake services
- Participate in pre-sales activities, including RFP and proposal writing
- Experience with integration of different data sources with data warehouses and data lakes is required
- Experience in creating data warehouses and data lakes for reporting, AI, and machine learning
- Understanding of data modelling and data architecture concepts
- Participate in proposal and capability presentations
- Able to clearly articulate the pros and cons of various technologies and platforms
- Collaborate with clients to understand their business requirements and translate them into technical solutions that leverage Snowflake and Azure cloud platforms
- Define and implement cloud governance and best practices
- Identify and implement automation opportunities to increase operational efficiency
- Conduct knowledge sharing and training sessions to educate clients and internal teams on cloud technologies

Posted 2 weeks ago

Apply

14.0 - 24.0 years

14 - 24 Lacs

Hyderabad / Secunderabad, Telangana, India

On-site


Cloud Data Architect: a cloud architect with experience in Azure and Snowflake, along with experience in RFP and proposal writing.

- Responsible for designing and implementing secure, scalable, and highly available cloud-based solutions, and for estimation, on AWS and Azure Cloud
- Experience in Azure Databricks, ADF, Azure Synapse, PySpark, and Snowflake services
- Participate in pre-sales activities, including RFP and proposal writing
- Experience with integration of different data sources with data warehouses and data lakes is required
- Experience in creating data warehouses and data lakes for reporting, AI, and machine learning
- Understanding of data modelling and data architecture concepts
- Participate in proposal and capability presentations
- Able to clearly articulate the pros and cons of various technologies and platforms
- Collaborate with clients to understand their business requirements and translate them into technical solutions that leverage Snowflake and Azure cloud platforms
- Define and implement cloud governance and best practices
- Identify and implement automation opportunities to increase operational efficiency
- Conduct knowledge sharing and training sessions to educate clients and internal teams on cloud technologies

Posted 2 weeks ago

Apply

14.0 - 24.0 years

14 - 24 Lacs

Pune, Maharashtra, India

On-site


Cloud Data Architect: a cloud architect with experience in Azure and Snowflake, along with experience in RFP and proposal writing.

- Responsible for designing and implementing secure, scalable, and highly available cloud-based solutions, and for estimation, on AWS and Azure Cloud
- Experience in Azure Databricks, ADF, Azure Synapse, PySpark, and Snowflake services
- Participate in pre-sales activities, including RFP and proposal writing
- Experience with integration of different data sources with data warehouses and data lakes is required
- Experience in creating data warehouses and data lakes for reporting, AI, and machine learning
- Understanding of data modelling and data architecture concepts
- Participate in proposal and capability presentations
- Able to clearly articulate the pros and cons of various technologies and platforms
- Collaborate with clients to understand their business requirements and translate them into technical solutions that leverage Snowflake and Azure cloud platforms
- Define and implement cloud governance and best practices
- Identify and implement automation opportunities to increase operational efficiency
- Conduct knowledge sharing and training sessions to educate clients and internal teams on cloud technologies

Posted 2 weeks ago

Apply

14.0 - 24.0 years

14 - 24 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site


Cloud Data Architect: a cloud architect with experience in Azure and Snowflake, along with experience in RFP and proposal writing.

- Responsible for designing and implementing secure, scalable, and highly available cloud-based solutions, and for estimation, on AWS and Azure Cloud
- Experience in Azure Databricks, ADF, Azure Synapse, PySpark, and Snowflake services
- Participate in pre-sales activities, including RFP and proposal writing
- Experience with integration of different data sources with data warehouses and data lakes is required
- Experience in creating data warehouses and data lakes for reporting, AI, and machine learning
- Understanding of data modelling and data architecture concepts
- Participate in proposal and capability presentations
- Able to clearly articulate the pros and cons of various technologies and platforms
- Collaborate with clients to understand their business requirements and translate them into technical solutions that leverage Snowflake and Azure cloud platforms
- Define and implement cloud governance and best practices
- Identify and implement automation opportunities to increase operational efficiency
- Conduct knowledge sharing and training sessions to educate clients and internal teams on cloud technologies

Posted 2 weeks ago

Apply

14.0 - 24.0 years

14 - 24 Lacs

Noida, Uttar Pradesh, India

On-site


Cloud Data Architect: a cloud architect with experience in Azure and Snowflake, along with experience in RFP and proposal writing.

- Responsible for designing and implementing secure, scalable, and highly available cloud-based solutions, and for estimation, on AWS and Azure Cloud
- Experience in Azure Databricks, ADF, Azure Synapse, PySpark, and Snowflake services
- Participate in pre-sales activities, including RFP and proposal writing
- Experience with integration of different data sources with data warehouses and data lakes is required
- Experience in creating data warehouses and data lakes for reporting, AI, and machine learning
- Understanding of data modelling and data architecture concepts
- Participate in proposal and capability presentations
- Able to clearly articulate the pros and cons of various technologies and platforms
- Collaborate with clients to understand their business requirements and translate them into technical solutions that leverage Snowflake and Azure cloud platforms
- Define and implement cloud governance and best practices
- Identify and implement automation opportunities to increase operational efficiency
- Conduct knowledge sharing and training sessions to educate clients and internal teams on cloud technologies

Posted 2 weeks ago

Apply

14.0 - 24.0 years

14 - 24 Lacs

Chennai, Tamil Nadu, India

On-site


Cloud Data Architect: a cloud architect with experience in Azure and Snowflake, along with experience in RFP and proposal writing.

- Responsible for designing and implementing secure, scalable, and highly available cloud-based solutions, and for estimation, on AWS and Azure Cloud
- Experience in Azure Databricks, ADF, Azure Synapse, PySpark, and Snowflake services
- Participate in pre-sales activities, including RFP and proposal writing
- Experience with integration of different data sources with data warehouses and data lakes is required
- Experience in creating data warehouses and data lakes for reporting, AI, and machine learning
- Understanding of data modelling and data architecture concepts
- Participate in proposal and capability presentations
- Able to clearly articulate the pros and cons of various technologies and platforms
- Collaborate with clients to understand their business requirements and translate them into technical solutions that leverage Snowflake and Azure cloud platforms
- Define and implement cloud governance and best practices
- Identify and implement automation opportunities to increase operational efficiency
- Conduct knowledge sharing and training sessions to educate clients and internal teams on cloud technologies

Posted 2 weeks ago

Apply

0.0 years

0 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site


Ready to shape the future of work? At Genpact, we don't just adapt to change - we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Our industry-first accelerator is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. Our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment.

Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.

Inviting applications for the role of Consultant - Sr. Data Engineer (DBT + Snowflake)!

In this role, the Sr. Data Engineer is responsible for providing technical direction and leading a group of one or more developers toward a goal.

Job description:
- Develop, implement, and optimize data pipelines using Snowflake, with a focus on Cortex AI capabilities (see the sketch after this listing).
- Extract, transform, and load (ETL) data from various sources into Snowflake, ensuring data integrity and accuracy.
- Implement conversational AI solutions using Snowflake Cortex AI to facilitate data interaction through chatbot agents.
- Collaborate with data scientists and AI developers to integrate predictive analytics and AI models into data workflows.
- Monitor and troubleshoot data pipelines to resolve data discrepancies and optimize performance.
- Utilize Snowflake's advanced features, including Snowpark, Streams, and Tasks, to enable data processing and analysis.
- Develop and maintain data documentation, best practices, and data governance protocols.
- Ensure data security, privacy, and compliance with organizational and regulatory guidelines.

Responsibilities:
- Bachelor's degree in Computer Science, Data Engineering, or a related field.
- Experience in data engineering, including experience working with Snowflake.
- Proven experience in Snowflake Cortex AI, focusing on data extraction, chatbot development, and conversational AI.
- Strong proficiency in SQL, Python, and data modeling.
- Experience with data integration tools (e.g., Matillion, Talend, Informatica).
- Knowledge of cloud platforms such as AWS, Azure, or GCP.
- Excellent problem-solving skills, with a focus on data quality and performance optimization.
- Strong communication skills and the ability to work effectively in a cross-functional team.
- Proficiency in using DBT's testing and documentation features to ensure the accuracy and reliability of data transformations.
- Understanding of data lineage and metadata management concepts, and the ability to track and document data transformations using DBT's lineage capabilities.
- Understanding of software engineering best practices and the ability to apply these principles to DBT development, including version control, code reviews, and automated testing.
- Experience building data ingestion pipelines.
- Experience with Snowflake utilities such as SnowSQL, Snowpipe, bulk copy, Snowpark, tables, Tasks, Streams, Time Travel, Cloning, Optimizer, Metadata Manager, data sharing, stored procedures and UDFs, and Snowsight.
- Good experience implementing CDC or SCD type-2.
- Proficiency in working with Airflow or other workflow management tools for scheduling and managing ETL jobs.
- Good to have: experience with repository tools like GitHub/GitLab or Azure Repos.

Qualifications/Minimum qualifications: B.E./Master's in Computer Science, Information Technology, or Computer Engineering, or any equivalent degree, with good IT experience and relevant working experience as a Sr. Data Engineer with DBT + Snowflake skillsets.

Skill matrix: DBT (Core or Cloud), Snowflake, AWS/Azure, SQL, ETL concepts, Airflow or any orchestration tool, and data warehousing concepts.

Why join Genpact?
- Be a transformation leader - work at the cutting edge of AI, automation, and digital innovation.
- Make an impact - drive change for global enterprises and solve business challenges that matter.
- Accelerate your career - get hands-on experience, mentorship, and continuous learning opportunities.
- Work with the best - join 140,000+ bold thinkers and problem-solvers who push boundaries every day.
- Thrive in a values-driven culture - our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress.

Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: Up. Let's build tomorrow together.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
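For the Cortex AI angle above, a hedged sketch of calling Snowflake's Cortex COMPLETE function over warehouse data from Python. The model name, table, column, and credentials are placeholders; availability of specific Cortex models varies by Snowflake region and account.

```python
# Sketch: summarize stored text with SNOWFLAKE.CORTEX.COMPLETE, the kind
# of building block a Cortex-based chatbot pipeline uses. Names are
# hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="ai_user", password="***",  # placeholders
    warehouse="AI_WH", database="ANALYTICS", schema="DW",
)

cur = conn.cursor()
cur.execute("""
    SELECT SNOWFLAKE.CORTEX.COMPLETE(
        'mistral-large',  -- placeholder model name
        'Summarize this customer feedback: ' || feedback_text
    )
    FROM customer_feedback
    LIMIT 5
""")
for (summary,) in cur:
    print(summary)
conn.close()
```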

Posted 3 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
